Web Survey Bibliography
It is common survey practice to convert a series of yes/no (forced-choice) items in telephone surveys to check-all-that-apply items in web and mail surveys. However, relatively little is known about how these different question formats may influence answers. This paper reports results from two web experiments and a comparison paper experiment in which nine different questions, varying in substantive topic and type (opinion-based and fact/behavior-based), were asked in 16 experimental comparisons of the check-all and forced-choice formats. Our purpose was to determine whether this change in question format influenced the number of response options marked affirmatively within each question and why any differences might occur. Results revealed that in every instance respondents marked significantly more items in the forced-choice format than in the check-all format. Given these results, detailed analyses of response patterns within questions, answering time, and alternative wording structures of questions were undertaken to examine which of three theories (satisficing, depth of processing, and acquiescence) best accounted for the response differences across question formats. These analyses indicated that the forced-choice format appears to invoke deeper processing and to eliminate satisficing behavior that occurs among some respondents to the check-all format, but that acquiescence does not seem to be an issue in the forced-choice format. Thus, it appears that the forced-choice question format is a desirable alternative to the check-all question format for multiple-answer questions. In addition, the findings reported here give ample reason to be concerned about the current practice of automatically converting items from the forced-choice format to the check-all format or vice versa when switching between telephone and paper or web surveys.
Web survey bibliography - Stern, M. J. (13)
- An Examination of How Survey Mode Affect Eligibility, Response and Health Condition Reporting Rates...; 2016; Stern, M. J.; Ghandour, R.
- Mode and Eligibility Rates in a Dual-mode Web and Mail Survey; 2016; Ventura, I.; Bilgen, I.; Stern, M. J.
- Can Using a Mixed Mode Approach Improve the Representativeness and Data Quality in Panel Surveys?; 2016; Stern, M. J.
- Spatial Modeling through GIS to Reveal Error Potential in Survey Data: Where, What and How Much; 2016; English, N.; Ventura, I.; Bilgen, I.; Stern, M. J.
- Where Does the Platform Matter: The Impact of Geographic Clustering in Device Ownership and Internet...; 2015; Bilgen, I.; English, N.; Stern, M. J.; Ventura, I.
- Comparing Field and Laboratory Usability Tests to Assess the Consistency and Mistakes in Web Survey...; 2015; Croen, A.; Gonzales, N.; Ghandour, R.; Stern, M. J.
- Question Grouping and Matrices in Web Surveys: Using Response and Auxiliary Data to Examine Question...; 2014; Bilgen, I.; Stern, M. J.
- Can Google Consumer Surveys Help Pre-Test Alternative Versions of a Survey Question?: A Comparison of...; 2013; Stern, M. J.; Welch, W. W.
- How Do Different Sampling Techniques Perform in a Web-Only Survey? Results From a Comparison of a Random...; 2013; Bilgen, I.; Stern, M. J.; Wolter, K.
- Are Response Rates to a Web-Only Survey Spatially Clustered?; 2013; Fiorio, L.; Stern, M. J.; English, N.; Bilgen, I.; Curtis, B.
- How Representative are Google Consumer Surveys?: Results From an Analysis of a Google Consumer Survey...; 2013; Krishnamurty, P.; Tanenbaum, E.; Stern, M. J.
- The effects of item saliency and question design on measurement error in a self-administered survey; 2012; Stern, M. J.; Mendez, J. D.; Smyth, J. D.
- Comparing Check-All and Forced-Choice Question Formats in Web Surveys: The Role of Satisficing, Depth...; 2005; Smyth, J. D.; Dillman, D. A.; Christian, L. M.; Stern, M. J.